
Gatsby Computational Neuroscience Unit


Michalis Titsias

 

Wednesday 12th July 2017

 

Time: 4pm

 

Ground Floor Seminar Room

25 Howland Street, London, W1T 4JG

 

Reparametrize MCMC: Implicit Variational Inference by Fitting Marginal
MCMC Distributions

We introduce a new framework for approximate probabilistic inference
that combines variational methods with Markov chain Monte Carlo
(MCMC). Specifically, we construct a very flexible implicit variational
distribution synthesized by an arbitrary black-box MCMC operation and
a reparametrizable deterministic transformation. We show how to
efficiently optimize this implicit distribution using the
reparametrization trick and a Monte Carlo Expectation Maximization
algorithm. Unlike current methods for implicit variational inference,
our method completely avoids the troublesome computation of log
density ratios and is therefore easily applicable to any continuous
and differentiable model. We will also discuss a second class of MCMC
and variational inference hybrids based on the idea "learn how to
start to speed up convergence". This second class is suitable for
discrete models and makes use of the log derivative trick.
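The two gradient estimators named in the abstract can be illustrated on a toy problem. The sketch below (an assumption for illustration, not the speaker's method) estimates the gradient of E_q[f(x)] with respect to the mean of a Gaussian q, once via the reparametrization trick and once via the log derivative (score function) trick; for f(x) = x² the true gradient is 2μ.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 1.5, 1.0          # parameters of q(x) = N(mu, sigma^2)
n = 200_000                    # number of Monte Carlo samples
f = lambda x: x ** 2           # toy objective E_q[f(x)]; true grad wrt mu is 2*mu

# Reparametrization trick: write x = mu + sigma * eps with eps ~ N(0, 1),
# then differentiate f(mu + sigma * eps) directly with respect to mu.
eps = rng.standard_normal(n)
grad_reparam = np.mean(2.0 * (mu + sigma * eps))

# Log derivative (score function) trick: grad = E_q[f(x) * d log q(x)/d mu],
# which needs no gradient of f and so also works for discrete models.
x = mu + sigma * rng.standard_normal(n)
score = (x - mu) / sigma ** 2  # d log N(x; mu, sigma^2) / d mu
grad_score = np.mean(f(x) * score)

print(grad_reparam, grad_score)  # both should be close to 2*mu = 3.0
```

The reparametrization estimator typically has much lower variance, which is why it is preferred for continuous differentiable models, while the score function estimator remains applicable when x is discrete.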